20. Cross-Entropy 2

Cross-Entropy

So we're getting somewhere: there's definitely a connection between probabilities and error functions, and it's called Cross-Entropy. This concept is tremendously popular in many fields, including Machine Learning. Let's dive deeper into the formula, and then actually code it!

[Video: Formula For Cross-Entropy]

[Video: Cross-Entropy V1]
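The videos aren't reproduced here, so for reference, this is the binary cross-entropy exactly as the solution code below computes it: for m points with labels y_i in {0, 1} and predicted probabilities p_i,

CE(Y, P) = -\sum_{i=1}^{m} \left[ y_i \ln(p_i) + (1 - y_i) \ln(1 - p_i) \right]

Each term keeps only the log of the probability the model assigned to the label that actually occurred (ln(p_i) when y_i = 1, ln(1 - p_i) when y_i = 0), so confident wrong predictions are penalized heavily.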

Quiz: Coding Cross-Entropy

Now it's time to shine! Let's code the formula for cross-entropy in Python.

Start Quiz:

import numpy as np

# Write a function that takes as input two lists Y, P,
# and returns the float corresponding to their cross-entropy.
def cross_entropy(Y, P):
    pass
Solution:

import numpy as np

def cross_entropy(Y, P):
    # Cast the input lists to NumPy float arrays so the
    # logs and products below apply element-wise.
    Y = np.asarray(Y, dtype=float)
    P = np.asarray(P, dtype=float)
    # For each point, take ln(P) when the label is 1 and ln(1-P)
    # when the label is 0, sum the terms, and flip the sign.
    return -np.sum(Y * np.log(P) + (1 - Y) * np.log(1 - P))
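As a quick sanity check, here's a hypothetical run of the solution with made-up labels and probabilities (the numbers are illustrative, not the quiz's grader inputs):

# Assumes the cross_entropy function defined in the solution above.
print(cross_entropy([1, 0], [0.9, 0.1]))  # ~0.211: confident and correct, low error
print(cross_entropy([1, 0], [0.1, 0.9]))  # ~4.605: confident and wrong, high error

The lower the cross-entropy, the better the predicted probabilities match the actual labels, which is exactly what we want from an error function.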

User's Answer:

(Note: The answer provided by the user is not guaranteed to be correct.)

import numpy as np

# Write a function that takes as input two lists Y, P,
# and returns the float corresponding to their cross-entropy.
def cross_entropy(Y, P):
    # Accumulate each point's contribution, then negate at the end.
    my_sum = 0.0
    for i in range(len(Y)):
        my_sum += Y[i]*np.log(P[i]) + (1-Y[i])*np.log(1-P[i])

    return -1.0*my_sum
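The user's loop and the official solution compute the same quantity; the loop just spells out the sum that np.sum performs over whole arrays. A minimal check that the two agree, using hypothetical helper names for the two variants:

import numpy as np

def cross_entropy_loop(Y, P):
    # The user's element-by-element version of the sum.
    total = 0.0
    for i in range(len(Y)):
        total += Y[i] * np.log(P[i]) + (1 - Y[i]) * np.log(1 - P[i])
    return -total

def cross_entropy_vectorized(Y, P):
    # The solution's whole-array version of the same sum.
    Y = np.asarray(Y, dtype=float)
    P = np.asarray(P, dtype=float)
    return -np.sum(Y * np.log(P) + (1 - Y) * np.log(1 - P))

Y, P = [1, 1, 0], [0.8, 0.7, 0.1]
print(np.isclose(cross_entropy_loop(Y, P), cross_entropy_vectorized(Y, P)))  # True

For long lists, the vectorized form is the idiomatic NumPy choice, since it avoids a Python-level loop.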